Skip argmin/argmax with dim=None in CoreML partitioner#19247

Open
john-rocky wants to merge 2 commits into pytorch:main from john-rocky:coreml/skip-argmax-dim-none

Conversation

@john-rocky

Summary

argmax(x, dim=None) / argmin(x, dim=None) reduce over the flattened
tensor. CoreML does not support this reduction, and the resulting model
intermittently crashes the process at runtime (the issue reproducer
crashes 100% of the time on an M1 Pro when the cell is run twice).
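For reference, the flattened-reduction semantics of dim=None can be seen with NumPy's argmax, whose default axis=None behaves the same way as torch.argmax with dim=None (NumPy is used here only as a runnable stand-in):

```python
import numpy as np

x = np.array([[1, 9],
              [3, 5]])

# axis=None (the default) reduces over the flattened array,
# mirroring torch.argmax(x, dim=None): index into x.ravel()
print(np.argmax(x))          # -> 1

# An explicit axis reduces along that dimension only,
# mirroring the dim=int form that CoreML can lower
print(np.argmax(x, axis=1))  # -> [1 1]
```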

Detect the dim=None case in should_override_support so the op
falls back to the portable backend. The ordinary dim=int form is
unaffected and is still delegated.

Fixes #11715.
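The shape of that check can be sketched as follows. This is a hypothetical, simplified stand-in: the real should_override_support hook in the ExecuTorch CoreML partitioner operates on torch.fx nodes, which are reduced here to a target string and an args tuple for illustration:

```python
# Hypothetical sketch of the dim=None skip described above;
# names and the node representation are illustrative only.
ARGREDUCE_TARGETS = {"aten.argmax.default", "aten.argmin.default"}

def should_skip_argreduce(target: str, args: tuple) -> bool:
    """Return True when an argmax/argmin call reduces over the
    flattened tensor (dim omitted or None), which CoreML cannot lower."""
    if target not in ARGREDUCE_TARGETS:
        return False
    # Schema: aten.argmax(Tensor self, int? dim=None, bool keepdim=False)
    dim = args[1] if len(args) > 1 else None
    return dim is None

print(should_skip_argreduce("aten.argmax.default", ("x",)))    # dim omitted -> True
print(should_skip_argreduce("aten.argmax.default", ("x", 1)))  # dim=1 -> False
```

A node for which this returns True is rejected by the partitioner and falls back to the portable backend, while the dim=int form continues to be delegated.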

Test plan

Added test_argmax_argmin_dim_none_is_skipped covering both branches:

  • argmax(x, dim=None) + argmin(x, dim=None) — neither op is delegated.
  • argmax(x, dim=1) — gets delegated as before.
$ python -m unittest -v executorch.backends.apple.coreml.test.test_coreml_partitioner.TestCoreMLPartitioner.test_argmax_argmin_dim_none_is_skipped
Ran 1 test in 1.042s

OK

Authored with Claude.

argmax/argmin with dim=None reduces over the flattened input, which
CoreML does not support and which intermittently crashes the process
at runtime.  Reject these in the partitioner so they fall back to
the portable backend; the ordinary dim=int form is still delegated.

Fixes pytorch#11715.
@john-rocky john-rocky requested a review from shoumikhin as a code owner May 1, 2026 04:48
@pytorch-bot

pytorch-bot Bot commented May 1, 2026

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/19247

Note: Links to docs will display an error until the docs builds have been completed.

⚠️ 12 Awaiting Approval

As of commit 2fe428f with merge base 94d2881:

AWAITING APPROVAL - The following workflows need approval before CI can run:

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@meta-cla

meta-cla Bot commented May 1, 2026

Hi @john-rocky!

Thank you for your pull request and welcome to our community.

Action Required

In order to merge any pull request (code, docs, etc.), we require contributors to sign our Contributor License Agreement, and we don't seem to have one on file for you.

Process

In order for us to review and merge your suggested changes, please sign at https://code.facebook.com/cla. If you are contributing on behalf of someone else (e.g. your employer), the individual CLA may not be sufficient and your employer may need to sign the corporate CLA.

Once the CLA is signed, our tooling will perform checks and validations. Afterwards, the pull request will be tagged with CLA signed. The tagging process may take up to 1 hour after signing. Please give it that time before contacting us about it.

If you have received this in error or have any questions, please contact us at cla@meta.com. Thanks!

@github-actions

github-actions Bot commented May 1, 2026

This PR needs a release notes: label

If your change should be included in the release notes (i.e. would users of this library care about this change?), please use a label starting with release notes:. This helps us keep track and include your important work in the next release notes.

To add a label, you can comment to pytorchbot, for example
@pytorchbot label "release notes: none"

For more information, see
https://github.com/pytorch/pytorch/wiki/PyTorch-AutoLabel-Bot#why-categorize-for-release-notes-and-how-does-it-work.

@meta-cla meta-cla Bot added the CLA Signed This label is managed by the Facebook bot. Authors need to sign the CLA before a PR can be reviewed. label May 1, 2026

Development

Successfully merging this pull request may close these issues.

CoreML argmin/argmax intermittently crashes process